Proximal Iteratively Reweighted Algorithm with Multiple Splitting for Nonconvex Sparsity Optimization

Authors

  • Canyi Lu
  • Yunchao Wei
  • Zhouchen Lin
  • Shuicheng Yan
Abstract

This paper proposes the Proximal Iteratively REweighted (PIRE) algorithm for solving a general problem that covers a large body of nonconvex sparse and structured sparse problems. Compared with previous iterative solvers for nonconvex sparse problems, PIRE is much more general and efficient: its computational cost per iteration is usually as low as that of state-of-the-art convex solvers. We further propose the PIRE algorithm with Parallel Splitting (PIRE-PS) and the PIRE algorithm with Alternative Updating (PIRE-AU) to handle multi-variable problems. In theory, we prove that our proposed methods converge and that any limit point is a stationary point. Extensive experiments on both synthetic and real data sets demonstrate that our methods achieve comparable learning performance while being much more efficient than previous nonconvex solvers.

Introduction

This paper aims to solve the following general problem:
$$\min_{\mathbf{x} \in \mathbb{R}^n} F(\mathbf{x}) = \lambda f(g(\mathbf{x})) + h(\mathbf{x}), \tag{1}$$
where $\lambda > 0$ is a parameter, and the functions in the above formulation satisfy the following conditions:

C1: $f(\mathbf{y})$ is nonnegative, concave and increasing.

C2: $g(\mathbf{x}): \mathbb{R}^n \to \mathbb{R}^n$ is a nonnegative multi-dimensional function, such that the problem
$$\min_{\mathbf{x} \in \mathbb{R}^n} \lambda \langle \mathbf{w}, g(\mathbf{x}) \rangle + \frac{1}{2}\|\mathbf{x} - \mathbf{b}\|^2 \tag{2}$$
is convex and can be cheaply solved for any given nonnegative $\mathbf{w} \in \mathbb{R}^n$.

C3: $h(\mathbf{x})$ is a smooth function of type $C^{1,1}$, i.e., continuously differentiable with Lipschitz continuous gradient:
$$\|\nabla h(\mathbf{x}) - \nabla h(\mathbf{y})\| \le L(h)\,\|\mathbf{x} - \mathbf{y}\| \quad \text{for any } \mathbf{x}, \mathbf{y} \in \mathbb{R}^n, \tag{3}$$
where $L(h) > 0$ is called the Lipschitz constant of $\nabla h$.

C4: $\lambda f(g(\mathbf{x})) + h(\mathbf{x}) \to \infty$ iff $\|\mathbf{x}\|_2 \to \infty$.

Note that problem (1) can be convex or nonconvex. Though $f(\mathbf{y})$ is concave, $f(g(\mathbf{x}))$ can be convex w.r.t. $\mathbf{x}$. Also, $f(\mathbf{y})$ and $g(\mathbf{x})$ are not necessarily smooth, and $h(\mathbf{x})$ is not necessarily convex. Based on different choices of $f$, $g$ and $h$, the general problem (1) covers many sparse representation models, which have many important applications in machine learning and computer vision (Wright et al. 2009; Beck and Teboulle 2009; Jacob, Obozinski, and Vert 2009; Gong, Ye, and Zhang 2012b). For the choice of $h$, the least squares and logistic loss functions are the two most widely used ones satisfying (C3): $h(\mathbf{x}) = \frac{1}{2}\|A\mathbf{x} - \mathbf{b}\|^2$, or the logistic loss $h(\mathbf{x}) = \frac{1}{n}\sum_{i=1}^{n} \log\bigl(1 + \exp(-b_i\,\mathbf{a}_i^\top \mathbf{x})\bigr)$.
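To make the setup concrete, below is a minimal sketch of the kind of proximal reweighted iteration the abstract describes, written in Python/NumPy. It assumes a specific instance of (1): $g(\mathbf{x}) = |\mathbf{x}|$ elementwise, $f(\mathbf{y}) = \sum_i (y_i + \epsilon)^p$ with $p \in (0, 1)$, and $h(\mathbf{x}) = \frac{1}{2}\|A\mathbf{x} - \mathbf{b}\|^2$, so that subproblem (2) reduces to weighted soft-thresholding. The update rule follows the standard pattern the conditions suggest (reweight $f$, take a gradient step on $h$, solve (2)); it is an illustration under these assumptions, not the authors' exact implementation.

```python
import numpy as np

def pire_lp_least_squares(A, b, lam=0.1, p=0.5, eps=1e-8,
                          max_iter=500, tol=1e-6):
    """Sketch of a proximal iteratively reweighted iteration for
        min_x  lam * sum((|x_i| + eps)**p) + 0.5 * ||A x - b||^2,
    i.e. f(y) = sum((y_i + eps)**p), g(x) = |x|, h(x) = 0.5 * ||Ax - b||^2.
    """
    x = np.zeros(A.shape[1])
    # L(h) for least squares is ||A||_2^2; take mu slightly above it.
    mu = 1.01 * np.linalg.norm(A, 2) ** 2
    for _ in range(max_iter):
        # Weights: a supergradient of the concave f evaluated at g(x).
        w = p * (np.abs(x) + eps) ** (p - 1)
        # Gradient step on the smooth part h.
        v = x - A.T @ (A @ x - b) / mu
        # Subproblem (2) with g(x) = |x|: weighted soft-thresholding.
        x_new = np.sign(v) * np.maximum(np.abs(v) - lam * w / mu, 0.0)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```

Each iteration costs one multiplication by $A$ and one by $A^\top$ plus elementwise work, which is why the per-iteration cost is comparable to that of convex proximal-gradient solvers.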


Similar Articles

Proximal linearized iteratively reweighted least squares for a class of nonconvex and nonsmooth problems

For solving a wide class of nonconvex and nonsmooth problems, we propose a proximal linearized iteratively reweighted least squares (PL-IRLS) algorithm. We first approximate the original problem by smoothing methods, and then rewrite the approximated problem as an auxiliary problem by introducing new variables. PL-IRLS is then built on solving the auxiliary problem by utilizing the proximal l...
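The snippet cuts off before the algorithmic details of PL-IRLS, but the reweighted least squares idea it builds on is standard. As a generic illustration (the smoothing parameter and the direct normal-equations solve below are common textbook choices, not necessarily the paper's), a smoothed IRLS for $\ell_p$-regularized least squares looks like:

```python
import numpy as np

def irls_lp(A, b, lam=0.1, p=0.5, eps=1e-6, max_iter=100):
    """Generic smoothed IRLS for
        min_x  lam * sum((x_i**2 + eps)**(p/2)) + 0.5 * ||A x - b||^2.
    Each iteration fixes the weights and solves a weighted ridge regression.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # least squares warm start
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(max_iter):
        # Reweighting: from the stationarity condition of the smoothed penalty.
        w = p * (x ** 2 + eps) ** (p / 2 - 1)
        x = np.linalg.solve(AtA + lam * np.diag(w), Atb)
    return x
```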


Proximal iteratively reweighted algorithm for low-rank matrix recovery

This paper proposes a proximal iteratively reweighted algorithm to recover a low-rank matrix, based on the weighted fixed point method. The weighted singular value thresholding problem admits a closed-form solution because of the special properties of the nonconvex surrogate functions. The study also shows that the proximal iteratively reweighted algorithm decreases the objective function...
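The closed form mentioned in the snippet is the weighted singular value thresholding operator. A minimal sketch follows; the shrink-and-reconstruct form below is the standard one, valid when the weights are nonnegative and non-decreasing while the singular values are in descending order, which is exactly the ordering that reweighting with a concave surrogate produces.

```python
import numpy as np

def weighted_svt(Y, w):
    """Solve  min_X  sum_i w_i * sigma_i(X) + 0.5 * ||X - Y||_F^2  in closed form.
    w holds one nonnegative weight per singular value; the formula requires
    w to be non-decreasing when the singular values are in descending order.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - w, 0.0)   # shrink each singular value by its weight
    return (U * s_shrunk) @ Vt          # equals U @ diag(s_shrunk) @ Vt
```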


An Alternating Proximal Splitting Method with Global Convergence for Nonconvex Structured Sparsity Optimization

In many learning tasks with structural properties, structured sparse modeling usually leads to better interpretability and higher generalization performance. While great efforts have focused on convex regularization, recent studies show that nonconvex regularizers can outperform their convex counterparts in many situations. However, the resulting nonconvex optimization problems are still ch...


A Weighted Two-Level Bregman Method with Dictionary Updating for Nonconvex MR Image Reconstruction

Nonconvex optimization has been shown to need substantially fewer measurements than $\ell_1$ minimization for exact recovery under a fixed transform/overcomplete dictionary. In this work, two efficient numerical algorithms, unified under the name weighted two-level Bregman method with dictionary updating (WTBMDU), are proposed for solving $\ell_p$ optimization under the dictionary-learning mod...


$L_{1/2}$ Regularization: Convergence of Iterative Half Thresholding Algorithm

In recent studies on sparse modeling, nonconvex regularization approaches (particularly $L_q$ regularization with $q \in (0, 1)$) have been demonstrated to offer considerable benefits in inducing sparsity and in efficiency. Compared with convex regularization approaches (say, $L_1$ regularization), however, the convergence issues of the corresponding algorithms are more difficult t...
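For the $L_{1/2}$ case, the half-thresholding operator has a known closed form (Xu et al. 2012), which is what makes the iteration cheap. The sketch below reproduces that operator and the associated iteration for $\min_{\mathbf{x}} \|A\mathbf{x} - \mathbf{b}\|^2 + \lambda\|\mathbf{x}\|_{1/2}^{1/2}$; the constants follow the published derivation and the step size choice is an assumption, so treat it as a sketch to check against the source.

```python
import numpy as np

def half_threshold(v, kappa):
    """Elementwise closed-form solution of  min_t (t - v)**2 + kappa * |t|**0.5,
    following the half-thresholding operator of Xu et al. (2012)."""
    out = np.zeros_like(v, dtype=float)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * kappa ** (2.0 / 3.0)
    big = np.abs(v) > thresh
    phi = np.arccos((kappa / 8.0) * (np.abs(v[big]) / 3.0) ** (-1.5))
    out[big] = (2.0 / 3.0) * v[big] * (1.0 + np.cos(2.0 * np.pi / 3.0
                                                    - (2.0 / 3.0) * phi))
    return out

def iterative_half_thresholding(A, b, lam=0.1, max_iter=500):
    """Iterative half thresholding for  min_x ||Ax - b||^2 + lam * ||x||_{1/2}^{1/2}."""
    x = np.zeros(A.shape[1])
    mu = 0.99 / np.linalg.norm(A, 2) ** 2   # step size just below 1 / ||A||_2^2
    for _ in range(max_iter):
        x = half_threshold(x + mu * A.T @ (b - A @ x), lam * mu)
    return x
```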




Publication date: 2014